496 research outputs found

    Stable Hebbian learning from spike timing-dependent plasticity

    We explore a synaptic plasticity model that incorporates recent findings that potentiation and depression can be induced by precisely timed pairs of synaptic events and postsynaptic spikes. In addition, we include the observation that strong synapses undergo relatively less potentiation than weak synapses, whereas depression is independent of synaptic strength. After random stimulation, the synaptic weights reach an equilibrium distribution that is stable, unimodal, and positively skewed. This weight distribution compares favorably to the distributions of quantal amplitudes and of receptor number observed experimentally in central neurons, and contrasts with the distribution found in plasticity models without size-dependent potentiation, which show strong competition between synapses.
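    The size-dependent rule described above can be sketched as a toy simulation: potentiation adds a fixed increment (so strong synapses gain relatively less), while depression removes a fixed fraction of the current weight. The rates, pairing statistics, and population size below are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed, not from the paper).
n_syn, n_steps = 5000, 3000
a, b = 0.1, 0.1            # additive LTP step, multiplicative LTD fraction

w = rng.uniform(0.5, 1.5, n_syn)
for _ in range(n_steps):
    ltp = rng.random(n_syn) < 0.5        # random pre/post pairing order
    # Fixed-size potentiation (relatively smaller for strong synapses);
    # depression removes the same fraction of every weight.
    w = np.where(ltp, w + a, w * (1.0 - b))

# The weights settle into a stable, unimodal distribution around a/b with
# positive skew, rather than splitting into a bimodal distribution.
```

    Because the net drift is linear in w while the fluctuation size grows with w, the stationary distribution keeps a longer right tail, which is the positive skew the abstract refers to.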

    Synaptic Scaling Balances Learning in a Spiking Model of Neocortex

    Learning in the brain requires complementary mechanisms: potentiation and activity-dependent homeostatic scaling. We introduce synaptic scaling to a biologically realistic spiking model of neocortex which can learn changes in oscillatory rhythms using STDP, and show that scaling is necessary to balance both positive and negative changes in input from potentiation and atrophy. We discuss some of the issues that arise when considering synaptic scaling in such a model, and show that scaling regulates activity whilst allowing learning to remain unaltered.
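    A minimal sketch of how multiplicative scaling can counteract ongoing potentiation while preserving learned weight ratios; the single-cell setup, target drive, and update rule below are illustrative assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative setup: one cell with 100 synapses, sporadic Hebbian
# potentiation events, and a homeostatic target for the total drive.
w = rng.uniform(0.2, 1.0, 100)
target, eta = 50.0, 0.1          # assumed target drive and scaling rate

for step in range(200):
    w[rng.integers(0, 100, 5)] *= 1.2     # sporadic potentiation events
    drive = w.sum()                        # crude proxy for activity
    w *= (target / drive) ** eta           # multiplicative homeostatic scaling

# Total drive hovers near the target, while the ratios between weights set
# by potentiation survive (multiplicative scaling preserves relative weights).
```

    The key design point is that the correction is multiplicative, so it regulates overall activity without erasing the relative differences that carry the learned information.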

    Attractor Metadynamics in Adapting Neural Networks

    Slow adaptation processes, like synaptic and intrinsic plasticity, abound in the brain and shape the landscape for the neural dynamics occurring on substantially faster timescales. At any given time the network is characterized by a set of internal parameters, which are adapting continuously, albeit slowly. This set of parameters defines the number and the location of the respective adiabatic attractors. The slow evolution of network parameters hence induces an evolving attractor landscape, a process which we term attractor metadynamics. We study the nature of the metadynamics of the attractor landscape for several continuous-time autonomous model networks. We find both first- and second-order changes in the location of adiabatic attractors and argue that the study of the continuously evolving attractor landscape constitutes a powerful tool for understanding the overall development of the neural dynamics.
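    The notion of an adiabatic attractor that moves as a slow parameter drifts can be illustrated with a generic one-dimensional rate unit; the equations below are a toy sketch, not one of the paper's model networks.

```python
import numpy as np

# Fast dynamics: dx/dt = -x + tanh(g * x); fixed points satisfy x = tanh(g x).
# The gain g plays the role of a slowly adapting internal parameter.

def attractor(g, x0=1.0, steps=2000, dt=0.05):
    """Relax the fast variable to its adiabatic attractor for a frozen gain g."""
    x = x0
    for _ in range(steps):
        x += dt * (-x + np.tanh(g * x))
    return x

gains = np.linspace(0.5, 2.0, 16)          # assumed slow parameter sweep
branch = [attractor(g) for g in gains]

# For g < 1 the only attractor is x = 0; as g crosses 1, a pair of nonzero
# attractors emerges continuously -- a second-order change in the landscape.
```

    Sweeping g and tracking the relaxed state traces out the evolving attractor landscape; abrupt jumps of the relaxed state under such a sweep would correspond to the first-order changes the abstract mentions.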

    Reading out population codes with a matched filter

    We study the optimal way to decode information present in a population code. Using a matched filter, the performance in Gaussian additive noise is as good as the theoretical maximum. The scheme can be applied when correlations among the neurons in the population are present. We show how the readout of the matched filter can be implemented in a neurophysiologically realistic manner. The method seems advantageous for computations in layered networks.
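    For additive Gaussian noise with covariance C, the matched filter reads out a population response with weights proportional to C^{-1}u, where u is the noise-free template; this maximizes the signal-to-noise ratio. A sketch with an assumed tuning template and an assumed correlation structure:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50
theta = np.linspace(-np.pi, np.pi, n)
u = np.exp(np.cos(theta))                  # hypothetical tuning template

# Assumed noise covariance: correlations decaying with distance in the array.
idx = np.arange(n)
C = 0.5 ** np.abs(idx[:, None] - idx[None, :])
L = np.linalg.cholesky(C)

w_mf = np.linalg.solve(C, u)               # matched-filter weights C^{-1} u

def snr(weights, trials=4000):
    """Empirical SNR of a linear readout: template response / noise std."""
    noise_out = weights @ (L @ rng.standard_normal((n, trials)))
    return (weights @ u) / noise_out.std()

# The matched filter attains the optimal SNR sqrt(u^T C^{-1} u); a uniform
# readout that ignores the correlations does worse.
```

    Comparing `snr(w_mf)` against a uniform readout `snr(np.ones(n))` shows the gain from whitening the correlated noise before projecting onto the template.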

    An Improved Test for Detecting Multiplicative Homeostatic Synaptic Scaling

    Homeostatic scaling of synaptic strengths is essential for maintenance of network “gain”, but also poses a risk of losing the distinctions among relative synaptic weights, which are possibly cellular correlates of memory storage. Multiplicative scaling of all synapses has been proposed as a mechanism that would preserve the relative weights among them, because they would all be proportionately adjusted. It is crucial for this hypothesis that all synapses be affected identically, but whether or not this actually occurs is difficult to determine directly. Mathematical tests for multiplicative synaptic scaling are presently carried out on distributions of miniature synaptic current amplitudes, but the accuracy of the test procedure has not been fully validated. We now show that the existence of an amplitude threshold for empirical detection of miniature synaptic currents limits the use of the most common method for detecting multiplicative changes. Our new method circumvents the problem by discarding the potentially distorting subthreshold values after computational scaling. This new method should be useful in assessing the underlying neurophysiological nature of a homeostatic synaptic scaling transformation, and therefore in evaluating its functional significance.
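    The threshold pitfall and the scale-then-discard fix can be illustrated on synthetic amplitude data; the log-normal shape, the detection threshold, and the known scale factor below are all assumptions of this toy example, not the paper's data or exact procedure.

```python
import numpy as np

rng = np.random.default_rng(3)
thresh = 5.0                 # assumed detection threshold (e.g. in pA)
factor = 1.25                # true multiplicative scaling, known here

# Synthetic mEPSC amplitudes; only suprathreshold events are detected.
control = rng.lognormal(mean=2.0, sigma=0.4, size=5000)
control = control[control > thresh]
treated = factor * rng.lognormal(mean=2.0, sigma=0.4, size=5000)
treated = treated[treated > thresh]

def ks_stat(x, y):
    """Two-sample Kolmogorov-Smirnov statistic (max CDF distance); x, y sorted."""
    grid = np.concatenate([x, y])
    fx = np.searchsorted(x, grid, side="right") / len(x)
    fy = np.searchsorted(y, grid, side="right") / len(y)
    return np.abs(fx - fy).max()

# Naive comparison: scale the control and compare directly. The treated sample
# still contains events between thresh and factor*thresh that the scaled
# control cannot contain, so the distributions look spuriously different.
naive = ks_stat(np.sort(factor * control), np.sort(treated))

# Improved comparison: scale first, THEN discard everything below
# factor * thresh in both samples, excluding the threshold-distorted region.
a = np.sort((factor * control)[(factor * control) > factor * thresh])
b = np.sort(treated[treated > factor * thresh])
refined = ks_stat(a, b)

# After scaling-then-discarding, the two samples come from the same
# distribution, so the refined KS distance is small while the naive one is not.
```

    The order of operations is the whole point: discarding subthreshold-contaminated values only after computational scaling is what makes the truncated samples comparable.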

    Summation of connectivity strengths in the visual cortex reveals stability of neuronal microcircuits after plasticity

    Background: Within sensory systems, neurons are continuously affected by environmental stimulation. Recently, we showed that, on a cell-pair basis, visual adaptation modulates the connectivity strength between neurons similarly tuned to orientation, and we suggested that, on a larger scale, the connectivity strength between neurons forming sub-networks could be maintained after adaptation-induced plasticity. In the present paper, based on the summation of the connectivity strengths, we sought to examine how, within cell assemblies, functional connectivity is regulated during an exposure-based adaptation. Results: Using intrinsic optical imaging combined with electrophysiological recordings following the reconfiguration of the maps of the primary visual cortex by long stimulus exposure, we found that within functionally connected cells, the summed connectivity strengths remain almost equal, although connections among individual pairs are modified. Neuronal selectivity appears to be strongly associated with neuronal connectivity in a “homeodynamic” manner which maintains the stability of cortical functional relationships after experience-dependent plasticity. Conclusions: Our results support the “homeostatic plasticity concept”, giving new perspectives on how summation in visual cortex leads to stability within labile neuronal ensembles, depending on the properties newly acquired by neurons.
